
    The Extraction of the Gluon Density from Jet Production in Deeply Inelastic Scattering

    The prospects of a direct extraction of the proton's gluon density at next-to-leading order via jet rates in deeply inelastic scattering are studied. The method employed is based on the Mellin transform and can, in principle, be applied to all infrared-safe observables of hadronic final states. We investigate how the error band of the extracted gluon distribution depends on the statistical and systematic errors of the data.
    Comment: 5 pages (LaTeX); 2 figures are included via epsfig; contribution to the workshop ``Future Physics at HERA'' at DESY, Hamburg, 1995/96; to be published in the proceedings; compressed PostScript version also available at http://wwwcn.cern.ch/~graudenz/publications.htm

    The Mellin Transform Technique for the Extraction of the Gluon Density

    A new method is presented to determine the gluon density in the proton from jet production in deeply inelastic scattering. By using the technique of Mellin transforms not only for the solution of the scale evolution equations of the parton densities but also for the evaluation of scattering cross sections, the gluon density can be extracted in next-to-leading-order QCD. The method described in this paper is more general, however, and can be used wherever a repeated fast numerical evaluation of scattering cross sections for varying parton distribution functions is required.
    Comment: 13 pages (LaTeX); 2 figures are included via epsfig; the corresponding postscript files are uuencoded
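    The repeated-evaluation idea can be illustrated with a toy model (the density, coefficient function, and all numbers below are invented for this sketch, not taken from the paper): once the cross section is written as a product in Mellin space, it is recovered by a numerical inverse Mellin transform, and only the analytic moments of the parton density change when its parametrisation is varied.

```python
import numpy as np
from scipy.special import gamma
from scipy.integrate import quad

def mellin_g(N):
    # Analytic Mellin moments of the toy gluon density g(x) = x**-0.5 * (1-x)**3:
    # g~(N) = Beta(N - 0.5, 4) = Gamma(N - 0.5) Gamma(4) / Gamma(N + 3.5)
    return gamma(N - 0.5) * gamma(4.0) / gamma(N + 3.5)

def mellin_C(N):
    # Mellin transform of the toy coefficient function C(z) = 1 on (0, 1)
    return 1.0 / N

def convolution_direct(x):
    # (g ⊗ C)(x) = ∫_x^1 dz/z g(z) C(x/z), evaluated by brute-force quadrature
    val, _ = quad(lambda z: z**-1.5 * (1.0 - z)**3, x, 1.0)
    return val

def convolution_mellin(x, c=1.5, tmax=120.0):
    # Inverse Mellin transform along the contour N = c + i*t; by conjugate
    # symmetry only the real part over t >= 0 is needed.  When the PDF
    # parametrisation changes, only mellin_g has to be recomputed.
    def integrand(t):
        N = c + 1j * t
        return (x**-N * mellin_g(N) * mellin_C(N)).real
    val, _ = quad(integrand, 0.0, tmax, limit=400)
    return val / np.pi

print(convolution_direct(0.1), convolution_mellin(0.1))  # both ~1.760
```

The contour abscissa c must lie to the right of all singularities of the moments (here Re N > 0.5); the two evaluations then agree to quadrature accuracy.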

    Re-Examination of Possible Bimodality of GALLEX Solar Neutrino Data

    The histogram formed from published capture-rate measurements for the GALLEX solar neutrino experiment is bimodal, showing two distinct peaks. On the other hand, the histogram formed from published measurements of the similar GNO experiment is unimodal, showing only one peak. However, the two experiments differ in run duration: GALLEX runs last approximately either three or four weeks, whereas GNO runs all last about four weeks. When we form 3-week and 4-week subsets of the GALLEX data, we find that the relevant histograms are unimodal: the upper peak arises mainly from the 3-week runs, and the lower peak from the 4-week runs. The 4-week subset of the GALLEX dataset is found to be similar to the GNO dataset. A recent re-analysis of GALLEX data leads to a unimodal histogram.
    Comment: 14 pages, 8 figures
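    A minimal sketch of the subsetting argument, using invented capture-rate values (not GALLEX data): pooling runs of two durations whose rates cluster around different means yields a bimodal histogram, while each duration subset on its own is unimodal.

```python
import numpy as np

# Hypothetical capture rates (SNU) for illustration only -- these are
# invented numbers, not GALLEX measurements.
rates_3wk = np.array([95, 96, 97, 98, 99, 100, 101, 102, 103, 104], float)
rates_4wk = np.array([57, 58, 59, 60, 60, 61, 61, 62, 62, 63], float)

def n_modes(data, bins=5):
    """Count peaks of a lightly smoothed histogram (a crude modality check)."""
    counts, _ = np.histogram(data, bins=bins)
    sm = np.convolve(counts, [1, 2, 1], mode="same")  # smooth bin counts
    padded = np.concatenate([[0], sm, [0]])           # allow peaks at the edges
    return sum(1 for i in range(1, len(padded) - 1)
               if padded[i] > padded[i - 1] and padded[i] >= padded[i + 1])

combined = np.concatenate([rates_3wk, rates_4wk])
print(n_modes(combined))     # 2: the pooled runs look bimodal
print(n_modes(rates_3wk))    # 1: each duration subset is unimodal
print(n_modes(rates_4wk))    # 1
```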

    Poverty Redemption: Why Those Affected Stay Affected

    This paper examines why those in poverty have not taken advantage of self-sufficiency programs such as those offered through Miami Valley Works. It looks at the general reasons why those in poverty do not complete self-sufficiency programs, whether they choose not to participate or leave a program before completion, and delves into the factors that could lead an individual to forgo participation or to drop out. The study examines how governmental policies, the culture of poverty, race, housing, education, and stigma affect individuals' choices to participate, or not, in the programs offered, specifically through Miami Valley Works. The question arises of how organizations can improve retention rates in order to increase the number of people in poverty who have the means to achieve self-sufficiency. Those in poverty in the community have expressed interest in self-sufficiency programs; however, enrollment decreases considerably throughout the stages of the process, including inquiry, orientation, and the program itself. This study will ideally shed light on how organizations can better reach those experiencing poverty in the Dayton community and achieve high rates of program participation. The ultimate goal is a better understanding of the barriers that those in poverty face when trying to achieve self-sufficiency.

    Information Theoretical Analysis of the Uniqueness of Iris Biometrics

    With the rapid globalization of technology, the need for a more reliable and secure online method of authentication has grown. Authentication can be achieved using each individual's distinctive biometric identifiers, such as the face, iris, fingerprint, or palmprint; however, there is a bound on the uniqueness of each identifier and, consequently, a limit to the capacity that a biometric recognition system can sustain before false matches occur. Knowing the maximum population that a biometric modality can uniquely represent is therefore essential now more than ever. To address the general problem, we turn to iris biometrics and measure its uniqueness. The measure of iris uniqueness was first introduced by John Daugman in 2003, and its analysis has remained an open research problem since. Daugman defines uniqueness as the ability to enroll more and more classes into a recognition system while the probability of collision among the classes remains fixed and near zero. Due to errors in collecting these datasets (such as occlusions, illumination conditions, camera noise, motion, and out-of-focus blur) and quality degradation from any signal processing of the iris data, even the highest-quality datasets will not approach a perfect zero probability of collision. We therefore appeal to techniques from information theory to find the maximum possible population the system can support while also measuring the quality of the iris data in the datasets themselves. This work develops two new techniques to find the maximum population of an iris database: finding the limitations of Daugman's widely accepted IrisCode, and proposing a new methodology that leverages the raw iris data. First, Daugman's IrisCodes are binary templates representing each independent class present in the database.
Assuming that a one-to-one encoding maps the IrisCode of each class to a new binary codeword, with the length determined by the degrees of freedom inferred from the distribution of distances between pairs of independent class IrisCodes, we can appeal to rate-distortion theory (the limits of error-correcting codes) to establish bounds on the maximum population the IrisCode algorithm can sustain, using the minimum Hamming distance (HD) between codewords as a quality metric. Our second approach uses an autoregressive (AR) model to estimate each iris class's distinctive power spectral density and then assumes a similar one-to-one mapping of each iris class to a unique Gaussian codeword. A Gaussian sphere-packing bound is invoked to obtain the maximum population of the dataset and to measure iris quality as a function of the noise present in the data. Another bound, a Daugman-like bound, is developed that uses the relative entropy between class models as a distance metric, analogous to Hamming distance, to find the maximum population given a fixed recognition error for the system. With these two approaches, we hope to help researchers understand the limitations of their recognition systems as a function of the quality of their iris database.
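    The two ingredients can be sketched in a few lines. The impostor-distribution numbers below are the ones commonly quoted from Daugman (2003), and the crude Hamming (sphere-packing) bound is a standard coding-theory inequality, not the paper's own derivation:

```python
from math import comb

def degrees_of_freedom(mean_hd, std_hd):
    # Daugman's estimate: impostor Hamming distances behave like a binomially
    # distributed fraction with N = p(1-p)/sigma^2 effective independent bits
    p = mean_hd
    return p * (1.0 - p) / std_hd**2

def hamming_bound(n, min_dist):
    # Sphere-packing (Hamming) bound: the number of length-n binary codewords
    # with pairwise distance >= min_dist is at most 2^n / |ball of radius t|
    t = (min_dist - 1) // 2
    ball = sum(comb(n, k) for k in range(t + 1))
    return 2**n // ball

# Daugman (2003) reported an impostor mean HD of about 0.499 with standard
# deviation about 0.0317, i.e. roughly 249 degrees of freedom.
print(round(degrees_of_freedom(0.499, 0.0317)))  # 249
print(hamming_bound(7, 3))   # 16, the size of the perfect Hamming(7,4) code
```

With the effective codeword length fixed this way, the bound gives the largest population that can be enrolled at a given minimum inter-class Hamming distance.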

    Changes in marsh nekton communities along the salinity gradient of the river Schelde: preliminary results (poster)

    In salt marshes, both the different habitats and the environmental parameters influence the composition of the nekton community. Taking both into account, we sampled five marshes along the River Schelde (Grembergen, Saeftinghe, Waarde, Zuidgors and Zwin), spanning a salinity gradient from 0.2 to 29 psu. Sampling took place from April until October 2000. Three different sampling techniques were used, adapted to the different habitats: fyke nets were set in the big creeks, a block net sampled the small creeks by closing the creek mouth at high water, and fish traps were placed in small ponds. Data analysis is still in progress, but remarkable differences have already been observed. The eel, Anguilla anguilla, dominates the fish community in the freshwater marsh; this area is characterised by low catches. The mesohaline and oligohaline marshes are dominated by two fish species, Pleuronectes flesus and Dicentrarchus labrax, and are marked by high fyke-net catches. In the small creeks, adult Palaemonetes varians and Pleuronectes flesus were captured by block net, while larvae of Pomatoschistus microps and adult Palaemonetes varians appeared in high numbers in the fish traps. The polyhaline marsh was characterised by low catches with all the different sampling techniques; only the shore crab, Carcinus maenas, was present in very high abundance in the creeks, although in autumn high numbers of Pomatoschistus microps were caught in the small channel.

    Habitat value of a developing estuarine brackish marsh

    Marsh creation receives worldwide attention in mitigating the loss of coastal wetlands and in the managed retreat of estuaries. In the Westerschelde, the former Selena Polder, south of "Het Verdronken Land van Saeftinghe", evolved into the Sieperda marsh after several dyke breaches. Soon after the tides regained access to the polder, a tidal creek was formed. Ten years later, a developing marsh system is found close to a mature one. This situation offered the rare opportunity to compare, under similar circumstances, the utilisation by nekton species of a natural mature marsh and a recently created developing marsh. Between April and October 1999, both the mature Saeftinghe marsh and the developing Sieperda marsh were sampled every six weeks on two consecutive days, with each sampling occasion covering the whole tidal cycle. The most important environmental parameters (water height, temperature, salinity, turbidity and dissolved oxygen) were similar in both marsh creeks. A distinct difference in nekton community structure between the two marshes was observed: total biomass and densities of nekton species were remarkably higher in Saeftinghe. There, a density peak occurred in July, mainly due to large numbers of the mysid Neomysis integer; in Sieperda, peak densities in September were caused by high abundances of the mysid Mesopodopsis slabberi. This difference in species dominance was observed on all sampling occasions. Biomass peaked in July in the mature marsh and in October in the developing marsh. Mysid shrimp (Neomysis integer) and fish (mainly Pomatoschistus microps) were the main contributors to biomass in the natural marsh, whereas herring and sprat (Clupeidae) and the shore crab (Carcinus maenas) were more important in Sieperda. For Pomatoschistus microps, distinct differences in length-frequency distributions were noted between the two marshes. While creek morphology plays an important role, the development stage of a marsh is believed to be a prime factor in determining the habitat function of creek systems of developing and mature marshes.

    Highly Sensitive Gamma-Spectrometers of GERDA for Material Screening: Part 2

    The previous article about material screening for GERDA pointed out the importance of strict material screening and selection for radioimpurities as a key to meeting the aspired background levels of the GERDA experiment. This screening is done directly using low-level gamma-spectroscopy. In order to provide sufficient selective power in the mBq/kg range and below, the employed gamma-spectrometers themselves have to meet strict material requirements and make use of an elaborate shielding system. This article gives an account of the setup of two such spectrometers: Corrado is located at a depth of 15 m w.e. at the MPI-K in Heidelberg (Germany), and GeMPI III is situated at the Gran Sasso underground laboratory (Italy) at 3500 m w.e. The latter aims at detecting sample activities of order 0.01 mBq/kg, the current state-of-the-art level. The techniques applied to meet the respective needs are discussed and demonstrated with experimental results.
    Comment: Featured in: Proceedings of the XIV International Baksan School "Particles and Cosmology", Baksan Valley, Kabardino-Balkaria, Russia, April 16-21, 2007. INR RAS, Moscow 2008. ISBN 978-5-94274-055-9, pp. 233-238 (6 pages, 4 figures)
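    For a rough sense of what such sensitivities involve, a Currie-style detection limit can be converted into a minimum detectable activity. All numbers below (efficiency, emission probability, counting time, background) are assumptions for illustration, not GERDA screening parameters:

```python
from math import sqrt

def minimum_detectable_activity(bkg_counts, eff, gamma_prob, t_s, mass_kg):
    """Currie detection limit (95% confidence) converted to activity in Bq/kg."""
    # L_D = 2.71 + 4.65 * sqrt(B) counts above an expected background of B counts
    l_d = 2.71 + 4.65 * sqrt(bkg_counts)
    # divide by full-energy efficiency, emission probability, live time and mass
    return l_d / (eff * gamma_prob * t_s * mass_kg)

# Assumed example: 30-day count, 2% full-energy efficiency, 85% emission
# probability, 50 background counts in the peak region, 1 kg sample.
t = 30 * 86400.0
mda = minimum_detectable_activity(50, 0.02, 0.85, t, 1.0)
print(f"{mda * 1e3:.3f} mBq/kg")  # 0.808 mBq/kg
```

Pushing such a figure toward 0.01 mBq/kg is what demands the deep underground site, long counting times, and radiopure shielding described above.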

    Results of ultra-low-level 71Ge counting for application in the GALLEX solar neutrino experiment at the Gran Sasso Underground Physics Laboratory

    It has been experimentally verified that the ultra-low-level counting system for the GALLEX solar neutrino experiment is capable of measuring the expected solar neutrino flux to plus or minus 12% during two years of operation.